1.
1st Workshop on NLP for COVID-19 at the 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020; 2020.
Article in English | Scopus | ID: covidwho-2282260

ABSTRACT

Social-science investigations can benefit from a direct comparison of heterogeneous corpora: in this work, we compare U.S. state-level COVID-19 policy announcements with policy discussions on Twitter. To perform this task, we require classifiers with high transfer accuracy to both (1) classify policy announcements and (2) classify tweets. We find that co-training using event-extraction views significantly improves the transfer accuracy of our RoBERTa classifier, by 3% over a RoBERTa baseline and 11% over other baselines. The same improvements are not observed for baseline views. With a set of 576 COVID-19 policy announcements, hand-labeled into one of six categories, our classifier achieves a maximum transfer performance of 0.77 F1 on a hand-validated set of tweets. This work represents the first known application of these techniques to an NLP transfer learning task and facilitates the cross-corpora comparisons necessary for studies of social science phenomena. © ACL 2020. All rights reserved.
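
The abstract does not spell out the co-training procedure, but the general two-view idea can be sketched as follows. The sketch below is illustrative only: it substitutes TF-IDF plus logistic regression for the paper's RoBERTa classifier, uses toy hand-written examples, invents a hypothetical "event view" string standing in for the output of an event-extraction system, and picks an arbitrary confidence threshold; none of these specifics come from the paper.

```python
# Minimal co-training sketch: two views of each document (raw text and an
# assumed event-extraction view), each with its own lightweight classifier.
# Confident pseudo-labels on unlabeled tweets are added back to the shared
# training set. All data and thresholds below are toy/illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled policy announcements: (text view, event view, label)
labeled = [
    ("All restaurants must close indoor dining", "CLOSE(restaurants, indoor dining)", "closure"),
    ("Masks required in all public indoor spaces", "REQUIRE(masks, indoor spaces)", "mask_mandate"),
    ("Gyms and salons may reopen at half capacity", "REOPEN(gyms, half capacity)", "reopening"),
    ("Face coverings mandatory on public transit", "REQUIRE(face coverings, transit)", "mask_mandate"),
]
# Toy unlabeled tweets with the same two views (the event view would come
# from an event-extraction system in practice).
unlabeled = [
    ("my gym finally reopened today!", "REOPEN(gym, today)"),
    ("no mask, no entry at the grocery store now", "REQUIRE(mask, grocery store)"),
]

X_text, X_event, y = (list(v) for v in zip(*labeled))
U_text, U_event = (list(v) for v in zip(*unlabeled))

clf_text = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf_event = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))

THRESHOLD = 0.4  # confidence needed to accept a pseudo-label (illustrative)

for _ in range(3):  # a few co-training rounds
    clf_text.fit(X_text, y)
    clf_event.fit(X_event, y)
    if not U_text:
        break
    probs_text = clf_text.predict_proba(U_text)
    probs_event = clf_event.predict_proba(U_event)
    keep = []
    for i in range(len(U_text)):
        # Take the more confident view's prediction, if it clears the threshold.
        p, clf = ((probs_text[i], clf_text)
                  if probs_text[i].max() >= probs_event[i].max()
                  else (probs_event[i], clf_event))
        if p.max() >= THRESHOLD:
            X_text.append(U_text[i])
            X_event.append(U_event[i])
            y.append(clf.classes_[p.argmax()])
        else:
            keep.append(i)
    U_text = [U_text[i] for i in keep]
    U_event = [U_event[i] for i in keep]

print(clf_text.predict(["schools will reopen next monday"]))
```

In the paper the views are used to co-train a RoBERTa classifier for transfer from policy announcements to tweets; the loop above only illustrates the general mechanism by which confident pseudo-labels from one view augment the training data shared with the other.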
